
    Non-Gaussian Component Analysis using Entropy Methods

    Non-Gaussian component analysis (NGCA) is a problem in multidimensional data analysis which, since its formulation in 2006, has attracted considerable attention in statistics and machine learning. In this problem, we have a random variable X in n-dimensional Euclidean space. There is an unknown subspace Γ of the n-dimensional Euclidean space such that the orthogonal projection of X onto Γ is standard multidimensional Gaussian, and the orthogonal projection of X onto Γ⊥, the orthogonal complement of Γ, is non-Gaussian, in the sense that all its one-dimensional marginals differ from the Gaussian in a certain metric defined in terms of moments. The NGCA problem is to approximate the non-Gaussian subspace Γ⊥ given samples of X. Vectors in Γ⊥ correspond to `interesting' directions, whereas vectors in Γ correspond to directions where the data is very noisy. The most interesting applications of the NGCA model are in the case when the magnitude of the noise is comparable to that of the true signal, a setting in which traditional noise-reduction techniques such as PCA do not apply directly. NGCA is also related to dimension reduction and to other data-analysis problems such as ICA. NGCA-like problems have been studied in statistics for a long time using techniques such as projection pursuit. We give an algorithm that takes polynomial time in the dimension n and has an inverse polynomial dependence on the error parameter measuring the angle distance between the non-Gaussian subspace and the subspace output by the algorithm. Our algorithm uses relative entropy as the contrast function and fits under the projection pursuit framework. The techniques we develop for analyzing our algorithm may be of use for other related problems.
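
    The model described above can be illustrated with a small synthetic-data sketch. The specific choices here (a 3-dimensional space, a 2-dimensional Gaussian noise subspace, a uniform distribution as the non-Gaussian signal, and a random rotation) are illustrative assumptions, not the paper's construction; the point is that when the signal is rescaled to unit variance, the covariance of X is the identity, so PCA sees no preferred direction and cannot recover Γ⊥.

    ```python
    import numpy as np

    def sample_ngca(n_samples, rng, gauss_dim=2, signal_dim=1):
        """Toy NGCA data: standard Gaussian on the noise subspace Gamma,
        a non-Gaussian (uniform, rescaled to unit variance) signal on
        Gamma-perp, then a random rotation so neither subspace is
        axis-aligned. Dimensions and distributions are illustrative."""
        d = gauss_dim + signal_dim
        g = rng.normal(size=(n_samples, gauss_dim))            # Gaussian noise part
        # uniform on [-sqrt(3), sqrt(3)] has variance exactly 1
        s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n_samples, signal_dim))
        x = np.hstack([g, s])
        # random orthogonal matrix (QR of a Gaussian matrix) hides the subspaces
        q, _ = np.linalg.qr(rng.normal(size=(d, d)))
        return x @ q.T, q

    rng = np.random.default_rng(0)
    x, q = sample_ngca(20000, rng)
    # every component has unit variance, so the covariance of x is close
    # to the identity: second moments carry no information about Gamma-perp
    print(np.round(np.cov(x.T), 2))
    ```

    Because second moments are uninformative here, any method that recovers Γ⊥ must exploit higher-order (non-Gaussian) structure, which is what contrast functions such as the relative entropy used in the paper are designed to capture.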

    ICA based on a Smooth Estimation of the Differential Entropy

    In this paper we introduce the MeanNN approach for the estimation of the main information-theoretic measures, such as differential entropy, mutual information and divergence. As opposed to other nonparametric approaches, MeanNN results in smooth, differentiable functions of the data samples with a clear geometrical interpretation. We then apply the proposed estimators to the ICA problem and obtain a smooth expression for the mutual information that can be analytically optimized by gradient-descent methods. The improved performance of the proposed ICA algorithm is demonstrated on several test examples in comparison with state-of-the-art techniques.
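
    A minimal sketch of the idea behind a MeanNN-style estimator, under the assumption that averaging the k-nearest-neighbor (Kozachenko-Leonenko) entropy estimator over all neighborhood sizes k reduces, up to an additive constant depending only on the sample size and dimension, to the mean of the log pairwise distances. The constant is omitted below, so only differences between estimates are meaningful; the function name and this simplification are illustrative, not the paper's exact formulation.

    ```python
    import numpy as np

    def meannn_entropy(x):
        """MeanNN-style differential entropy estimate (up to an additive
        constant): d times the mean of log pairwise Euclidean distances.
        Unlike a single-k nearest-neighbor estimate, this is a smooth,
        differentiable function of the samples, which is what makes
        gradient-based ICA optimization possible."""
        x = np.asarray(x, dtype=float)
        n, d = x.shape
        diff = x[:, None, :] - x[None, :, :]          # all pairwise differences
        dist = np.sqrt((diff ** 2).sum(axis=-1))      # Euclidean distances
        iu = np.triu_indices(n, k=1)                  # each unordered pair once
        return d * np.mean(np.log(dist[iu]))

    rng = np.random.default_rng(1)
    x = rng.normal(size=(200, 2))
    # scaling the data by s shifts the estimate by exactly d * log(s),
    # matching how differential entropy transforms under scaling
    print(meannn_entropy(2.0 * x) - meannn_entropy(x))
    ```

    The scaling check in the usage example is exact for this pairwise form, since multiplying every sample by s multiplies every pairwise distance by s.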